# RoBERTa Optimization

## Hwtcmner

A named entity recognition (NER) model fine-tuned from BERT for the traditional Chinese medicine (TCM) domain, achieving leading performance on NER tasks in this field.

- **Tags:** Sequence Labeling · Transformers · Chinese
- **Author:** Monor · **License:** Apache-2.0 · **Downloads:** 18 · **Likes:** 3

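For a token-classification model like this, the raw per-token `B-`/`I-`/`O` predictions still have to be grouped into entity spans. Below is a minimal sketch of that decoding step; the listing does not give the Hub repo id or label set for Hwtcmner, so the repo id is a placeholder and the `HERB`/`DIS` labels in the example are illustrative only.

```python
def bio_to_entities(tokens, tags):
    """Group per-token BIO tags into (entity_type, text) spans.

    Tokens are joined with no separator, which suits character-level
    Chinese tokenization.
    """
    entities, cur, cur_type = [], [], None
    for tok, tag in zip(tokens, tags):
        if tag.startswith("B-"):
            if cur:
                entities.append((cur_type, "".join(cur)))
            cur, cur_type = [tok], tag[2:]
        elif tag.startswith("I-") and cur and tag[2:] == cur_type:
            cur.append(tok)
        else:  # "O" tag or an inconsistent "I-" tag closes the current span
            if cur:
                entities.append((cur_type, "".join(cur)))
            cur, cur_type = [], None
    if cur:
        entities.append((cur_type, "".join(cur)))
    return entities


def hwtcmner_entities(text):
    """Sketch only: needs `transformers` and network access.

    The Hub repo id for Hwtcmner is not stated in the listing, so the
    id below is a placeholder, not a real model id.
    """
    from transformers import pipeline
    ner = pipeline("token-classification", model="<hwtcmner-repo-id>")
    preds = ner(text)
    return bio_to_entities([p["word"] for p in preds],
                           [p["entity"] for p in preds])
```

The pure `bio_to_entities` helper is independent of the model, so the same decoding works for any BIO-tagged sequence-labeling output.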
## Phobert Base V2

PhoBERT is a state-of-the-art pretrained language model for Vietnamese, built on the RoBERTa pretraining procedure, and performs strongly across a range of Vietnamese NLP tasks.

- **Tags:** Large Language Model · Transformers · Other
- **Author:** vinai · **Downloads:** 54.89k · **Likes:** 27

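A common way to use an encoder like PhoBERT is to pool its token embeddings into a fixed-size sentence vector. The sketch below shows mask-aware mean pooling plus an uncalled loader function; it assumes the `vinai/phobert-base-v2` Hub id and that, as documented for PhoBERT, input text is pre-word-segmented (syllables of a word joined by underscores, e.g. via VnCoreNLP).

```python
import numpy as np


def mean_pool(token_embeddings: np.ndarray, attention_mask: np.ndarray) -> np.ndarray:
    """Average token embeddings over the sequence axis, ignoring padding.

    token_embeddings: (batch, seq, hidden); attention_mask: (batch, seq).
    """
    mask = attention_mask[..., None].astype(token_embeddings.dtype)
    summed = (token_embeddings * mask).sum(axis=1)
    counts = np.clip(mask.sum(axis=1), 1e-9, None)  # avoid divide-by-zero
    return summed / counts


def phobert_sentence_vectors(sentences):
    """Sketch only: requires `transformers`, `torch`, and network access.

    PhoBERT expects word-segmented Vietnamese input, e.g.
    "Chúng_tôi là những nghiên_cứu_viên ."
    """
    import torch
    from transformers import AutoModel, AutoTokenizer
    tok = AutoTokenizer.from_pretrained("vinai/phobert-base-v2")
    model = AutoModel.from_pretrained("vinai/phobert-base-v2")
    enc = tok(sentences, padding=True, return_tensors="pt")
    with torch.no_grad():
        hidden = model(**enc).last_hidden_state  # (batch, seq, 768)
    return mean_pool(hidden.numpy(), enc["attention_mask"].numpy())
```

Mean pooling is only one reasonable choice here; taking the `<s>` (CLS) vector is another common option.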
## Chatgpt Detector Roberta Chinese

A RoBERTa-based detector for Chinese ChatGPT-generated text, designed to distinguish human-written text from ChatGPT output.

- **Tags:** Text Classification · Transformers · Chinese
- **Author:** Hello-SimpleAI · **Downloads:** 368 · **Likes:** 24

## Legal Roberta Large

A legal-domain language model continuously pretrained from RoBERTa-large on the LeXFiles legal corpus.

- **Tags:** Large Language Model · Transformers · English
- **Author:** lexlms · **Downloads:** 367 · **Likes:** 13

## Relbert Roberta Large

RelBERT is a RoBERTa-large-based model for relation embeddings, trained on the SemEval-2012 Task 2 dataset with Noise Contrastive Estimation (NCE).

- **Tags:** Text Embedding · Transformers
- **Author:** relbert · **Downloads:** 97 · **Likes:** 2

## Roberta Base Cuad Finetuned

A RoBERTa model fine-tuned for question answering on the Contract Understanding Atticus Dataset (CUAD), aimed at legal contract review.

- **Tags:** Question Answering System · Transformers · English
- **Author:** gustavhartz · **Downloads:** 387 · **Likes:** 1

## Simpledataset

A model fine-tuned from distilroberta-base; its intended use and training data are not clearly documented.

- **Tags:** Large Language Model · Transformers
- **Author:** DioLiu · **License:** Apache-2.0 · **Downloads:** 174 · **Likes:** 0

## Erlangshen Roberta 330M Sentiment

A sentiment analysis model fine-tuned from the Chinese RoBERTa-wwm-ext-large model on several sentiment analysis datasets.

- **Tags:** Text Classification · Transformers · Chinese
- **Author:** IDEA-CCNL · **License:** Apache-2.0 · **Downloads:** 65.15k · **Likes:** 34

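A classifier like this maps text to class logits, which are turned into a label via softmax plus argmax. The sketch below shows that decoding step as a pure function, with an uncalled loader assuming the `IDEA-CCNL/Erlangshen-Roberta-330M-Sentiment` Hub id matching the author and name above.

```python
import numpy as np


def softmax(logits: np.ndarray) -> np.ndarray:
    """Numerically stable softmax over the last axis."""
    z = logits - logits.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)


def predict_label(logits, labels):
    """Return (best_label, probability) for one example's class logits."""
    probs = softmax(np.asarray(logits, dtype=float))
    i = int(probs.argmax())
    return labels[i], float(probs[i])


def erlangshen_sentiment(texts):
    """Sketch only: requires `transformers` and network access; the
    pipeline applies the same softmax + argmax decoding internally."""
    from transformers import pipeline
    clf = pipeline("text-classification",
                   model="IDEA-CCNL/Erlangshen-Roberta-330M-Sentiment")
    return clf(texts)
```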
## Roberta Base Squad2

A RoBERTa-based extractive question answering model trained on the SQuAD 2.0 dataset, suitable for English QA tasks.

- **Tags:** Question Answering System · Transformers · English
- **Author:** ydshieh · **Downloads:** 31 · **Likes:** 0

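Extractive QA models of this kind output per-token start and end logits, and the answer is the span maximizing their sum. The decoding helper below is a minimal version of that standard step (SQuAD 2.0 models additionally score a null answer for unanswerable questions, which is omitted here); the Hub id in the uncalled sketch function is assumed from the author and name above.

```python
def best_span(start_logits, end_logits, max_answer_len=30):
    """Return ((start, end), score) maximizing start + end logit, end >= start.

    A simple O(n * max_answer_len) scan over candidate spans.
    """
    best, best_score = (0, 0), float("-inf")
    for s, s_logit in enumerate(start_logits):
        for e in range(s, min(s + max_answer_len, len(end_logits))):
            score = s_logit + end_logits[e]
            if score > best_score:
                best_score, best = score, (s, e)
    return best, best_score


def answer(question, context):
    """Sketch only: requires `transformers` and network access.
    The repo id is assumed, not confirmed by the listing."""
    from transformers import pipeline
    qa = pipeline("question-answering", model="ydshieh/roberta-base-squad2")
    return qa(question=question, context=context)
```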
## Roberta Base Fiqa Flm Sq Flit

A RoBERTa-base model fine-tuned for financial-domain question answering, targeting custom QA systems in banking, finance, and insurance.

- **Tags:** Question Answering System · Transformers
- **Author:** vanadhi · **Downloads:** 205 · **Likes:** 1

## Roberta Base MITmovie

A RoBERTa-base model trained for named entity recognition on the MIT Movie dataset.

- **Tags:** Sequence Labeling
- **Author:** thatdramebaazguy · **Downloads:** 34 · **Likes:** 1

## Simcse Chinese Roberta Wwm Ext

A Chinese sentence-embedding model trained with simple contrastive learning (SimCSE), using Chinese RoBERTa-wwm-ext as the pretrained backbone.

- **Tags:** Text Embedding · Transformers
- **Author:** cyclone · **Downloads:** 188 · **Likes:** 32

## Xlm Roberta Base Finetuned Marc En

A multilingual text classification model fine-tuned from XLM-RoBERTa-base on the amazon_reviews_multi dataset.

- **Tags:** Large Language Model · Transformers
- **Author:** daveccampbell · **License:** MIT · **Downloads:** 29 · **Likes:** 0

## Sup Simcse Roberta Large

A supervised SimCSE model based on RoBERTa-large for sentence embedding and feature extraction tasks.

- **Tags:** Text Embedding
- **Author:** princeton-nlp · **Downloads:** 276.47k · **Likes:** 25

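Sentence-embedding models like SimCSE are typically used by encoding two sentences and comparing their vectors with cosine similarity. A minimal sketch, assuming the `princeton-nlp/sup-simcse-roberta-large` Hub id and `[CLS]`-based pooling (one common choice for SimCSE checkpoints; the SimCSE codebase also offers CLS-before-pooler pooling):

```python
import numpy as np


def cosine(u, v) -> float:
    """Cosine similarity between two 1-D vectors."""
    u = np.asarray(u, dtype=float)
    v = np.asarray(v, dtype=float)
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))


def simcse_embed(sentences):
    """Sketch only: requires `transformers`, `torch`, and network access.

    Uses the [CLS] vector passed through the model's pooler; an assumed
    pooling choice, not confirmed by the listing.
    """
    import torch
    from transformers import AutoModel, AutoTokenizer
    repo = "princeton-nlp/sup-simcse-roberta-large"
    tok = AutoTokenizer.from_pretrained(repo)
    model = AutoModel.from_pretrained(repo)
    enc = tok(sentences, padding=True, return_tensors="pt")
    with torch.no_grad():
        out = model(**enc)
    return out.pooler_output.numpy()  # (batch, hidden)
```

With embeddings in hand, `cosine(vecs[0], vecs[1])` gives a similarity score in [-1, 1].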
## Emoroberta

A fine-grained emotion classification model based on the RoBERTa architecture, trained on the GoEmotions dataset and able to recognize 28 emotion categories.

- **Tags:** Text Classification · Transformers · English
- **Author:** arpanghoshal · **License:** MIT · **Downloads:** 21.47k · **Likes:** 119

## Rubiobert

RuBioRoBERTa is a RoBERTa model pretrained on Russian biomedical text, intended for natural language processing tasks in the biomedical domain.

- **Tags:** Large Language Model · Transformers
- **Author:** alexyalunin · **Downloads:** 686 · **Likes:** 1

## Roberta Base On Cuad

A RoBERTa-base model fine-tuned for legal contract question answering, designed for contract review.

- **Tags:** Question Answering System · Transformers · English
- **Author:** Rakib · **License:** MIT · **Downloads:** 14.79k · **Likes:** 8

## Roberta Base Finetuned Sst2

A RoBERTa-based text classification model fine-tuned on the SST-2 sentiment analysis task, reaching 94.5% accuracy.

- **Tags:** Text Classification · Transformers
- **Author:** Bhumika · **Downloads:** 53 · **Likes:** 4

## Roberta Base English Upos

A RoBERTa-based model trained on the UD_English dataset for POS tagging and dependency parsing.

- **Tags:** Sequence Labeling · Transformers · Supports Multiple Languages
- **Author:** KoichiYasuoka · **Downloads:** 64 · **Likes:** 1

© 2025 AIbase